A Deterministic Global Optimization Method for Variational Inference

Authors

  • Hachem Saddiki
  • Andrew C. Trapp
  • Patrick Flaherty
Abstract

Variational inference methods for latent variable statistical models have gained popularity because they are relatively fast, can handle large data sets, and have deterministic convergence guarantees. However, in practice it is unclear whether the fixed point identified by the variational inference algorithm is a local or a global optimum. Here, we propose a method for constructing iterative optimization algorithms for variational inference problems that are guaranteed to converge to the ε-global variational lower bound on the log-likelihood. We derive inference algorithms for two variational approximations to a standard Bayesian Gaussian mixture model (BGMM). We present a minimal data set for empirically testing convergence and show that a variational inference algorithm frequently converges to a local optimum while our algorithm always converges to the globally optimal variational lower bound. We characterize the loss incurred by selecting a nonoptimal variational approximation distribution, suggesting that selection of the approximating variational distribution deserves as much attention as the selection of the original statistical model for a given data set.
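The abstract's central point, that coordinate-ascent variational inference (CAVI) can stall at a suboptimal fixed point, can be seen in a small sketch. This is not the paper's algorithm or data set; it is a generic mean-field CAVI for a one-dimensional Bayesian Gaussian mixture (unit-variance components, an assumed N(0, 100) prior on the means), where an exactly symmetric initialization is a fixed point with a far lower ELBO than a well-separated one:

```python
import numpy as np

def cavi_gmm(x, m_init, sigma2=100.0, n_iter=200, K=2):
    """Mean-field CAVI for a toy Bayesian GMM: unit-variance components,
    mu_k ~ N(0, sigma2) prior. Factors: q(mu_k) = N(m_k, s2_k),
    q(c_i) = Cat(phi_i). Returns the final means and an ELBO that is
    correct up to an additive constant shared by all runs."""
    x = np.asarray(x, dtype=float)
    m = np.array(m_init, dtype=float)
    s2 = np.ones(K)
    for _ in range(n_iter):
        # q(c) update: phi_ik proportional to exp(E[log p(x_i | mu_k)])
        logits = np.outer(x, m) - 0.5 * (m**2 + s2)      # shape (n, K)
        logits -= logits.max(axis=1, keepdims=True)       # stability
        phi = np.exp(logits)
        phi /= phi.sum(axis=1, keepdims=True)
        # q(mu) update
        denom = 1.0 / sigma2 + phi.sum(axis=0)
        m = (phi * x[:, None]).sum(axis=0) / denom
        s2 = 1.0 / denom
    elbo = (
        -0.5 * ((m**2 + s2) / sigma2).sum()                    # E[log p(mu)]
        + (phi * (np.outer(x, m) - 0.5 * (m**2 + s2))).sum()   # E[log p(x|c,mu)]
        - (phi * np.log(phi + 1e-12)).sum()                    # entropy of q(c)
        + 0.5 * np.log(s2).sum()                               # entropy of q(mu)
    )
    return m, elbo

# Two well-separated clusters, symmetric about zero.
x = np.array([-4.2, -4.0, -3.8, 3.8, 4.0, 4.2])
m_bad, elbo_bad = cavi_gmm(x, m_init=[0.0, 0.0])     # symmetric start: stuck
m_good, elbo_good = cavi_gmm(x, m_init=[-4.0, 4.0])  # separated start
```

With the symmetric start, every responsibility stays exactly 0.5 and both means stay at the data mean, so CAVI never moves; the separated start recovers both clusters and a much higher ELBO. This is the kind of local fixed point the paper's ε-global method is designed to rule out.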


Similar Articles

Deterministic Annealing for Stochastic Variational Inference

Stochastic variational inference (SVI) maps posterior inference in latent variable models to nonconvex stochastic optimization. While such methods enable approximate posterior inference for many otherwise intractable models, variational inference methods suffer from local optima. We introduce deterministic annealing for SVI to overcome this issue. We introduce a temperature parameter that deterministic...
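The temperature idea in this teaser can be sketched generically (this is an illustration of tempering, not the paper's exact schedule or objective): dividing the assignment logits by a temperature T > 1 flattens the variational responsibilities, smoothing the objective early on, and annealing T down to 1 recovers the ordinary mean-field update.

```python
import numpy as np

def tempered_responsibilities(logits, T):
    """Cluster responsibilities under an annealed variational objective.
    High T yields near-uniform q(c); T = 1 is the standard softmax."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)   # numerical stability
    p = np.exp(z)
    return p / p.sum(axis=-1, keepdims=True)

def entropy(p):
    """Shannon entropy in nats."""
    return float(-(p * np.log(p)).sum())

logits = np.array([3.0, 0.5, -1.0])
hot = tempered_responsibilities(logits, T=10.0)   # near-uniform, early phase
cold = tempered_responsibilities(logits, T=1.0)   # standard assignment
```

The hot distribution has strictly higher entropy than the cold one, which is exactly the smoothing effect annealing exploits before committing to hard assignments.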


Stochastic Annealing for Variational Inference

We empirically evaluate a stochastic annealing strategy for Bayesian posterior optimization with variational inference. Variational inference is a deterministic approach to approximate posterior inference in Bayesian models in which a typically non-convex objective function is locally optimized over the parameters of the approximating distribution. We investigate an annealing method for optimiz...


Proximity Variational Inference

Variational inference is a powerful approach for approximate posterior inference. However, it is sensitive to initialization and can be subject to poor local optima. In this paper, we develop proximity variational inference (pvi). pvi is a new method for optimizing the variational objective that constrains subsequent iterates of the variational parameters to robustify the optimization path. Con...
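The constraint described in this teaser can be sketched as an ascent step that is penalized for moving away from the previous iterate. The quadratic proximity function below is a hypothetical simplification for illustration; pvi's actual proximity statistics differ.

```python
def proximal_step(lam, grad, lam_prev, step=0.1, k=1.0):
    """One proximity-constrained ascent step on a variational parameter:
    follow the objective gradient `grad`, minus the gradient of the
    penalty k/2 * (lam - lam_prev)**2 that keeps iterates close."""
    return lam + step * (grad - k * (lam - lam_prev))

# With a zero objective gradient, the proximity term pulls lam back
# toward its previous value, damping erratic moves early in optimization.
lam_next = proximal_step(lam=2.0, grad=0.0, lam_prev=0.0)
```

When k = 0 this reduces to plain gradient ascent; larger k trades per-step progress for a smoother, more robust optimization path.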


Fuzzy Inference System Approach in Deterministic Seismic Hazard, Case Study: Qom Area, Iran

Seismic hazard assessment, like many other issues in seismology, is a complicated problem due to the variety of parameters affecting the occurrence of an earthquake. Uncertainty, which results from the vagueness and incompleteness of the data, should be considered in a rational way. Using fuzzy methods makes it possible for these uncertainties to be taken into account. Fuzzy inference system,...




Journal:

Volume   Issue

Pages  -

Published 2016